Coordinate Descent Algorithm for Ramp Loss Linear Programming Support Vector Machines


Related articles

Ramp loss linear programming support vector machine

The ramp loss is a robust but non-convex loss for classification. Compared with other non-convex losses, a local minimum of the ramp loss can be effectively found. The effectiveness of local search comes from the piecewise linearity of the ramp loss. Motivated by the fact that the ℓ1-penalty is piecewise linear as well, the ℓ1-penalty is applied for the ramp loss, resulting in a ramp loss linea...
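As an illustrative sketch (not the paper's code), the ramp loss can be written as a difference of two hinge functions, which makes its piecewise linearity and boundedness explicit:

```python
def hinge(z):
    # standard hinge loss: max(0, 1 - z); convex but unbounded below z = 1
    return max(0.0, 1.0 - z)

def ramp(z, s=-1.0):
    # ramp loss as a difference of two hinges: the hinge at margin 1
    # minus a hinge clipped at margin s (here s = -1).  The result is
    # bounded above by 1 - s, so outliers with very negative margins
    # contribute a fixed loss -- robust, but non-convex.
    return hinge(z) - max(0.0, s - z)
```

Because both pieces are piecewise linear, a local search over the "clipped" set can be carried out with linear programming tools, which is the connection the abstract draws to the ℓ1-penalty.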


Coordinate Descent Method for Large-scale L2-loss Linear Support Vector Machines

Linear support vector machines (SVM) are useful for classifying large-scale sparse data. Problems with sparse features are common in applications such as document classification and natural language processing. In this paper, we propose a novel coordinate descent algorithm for training linear SVM with the L2-loss function. At each step, the proposed method minimizes a one-variable sub-problem w...
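A minimal sketch of the idea, assuming a Newton-style one-variable update on the primal L2-loss SVM (the function and variable names are ours, not the paper's; a production solver caches the margins instead of recomputing them each step):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def l2svm_objective(w, X, y, C):
    # primal L2-loss SVM: 0.5 * ||w||^2 + C * sum_i max(0, 1 - y_i w.x_i)^2
    loss = sum(max(0.0, 1.0 - yi * dot(w, xi)) ** 2 for xi, yi in zip(X, y))
    return 0.5 * dot(w, w) + C * loss

def cd_l2svm(X, y, C=1.0, sweeps=50):
    # cyclic coordinate descent: minimize a one-variable sub-problem in w[j]
    # at each step via a Newton step (the L2 loss is differentiable with a
    # generalized second derivative, so h below is well defined and >= 1).
    d = len(X[0])
    w = [0.0] * d
    for _ in range(sweeps):
        for j in range(d):
            g, h = w[j], 1.0                  # gradient and curvature in w[j]
            for xi, yi in zip(X, y):
                m = 1.0 - yi * dot(w, xi)     # margin violation of example i
                if m > 0:                     # only active examples contribute
                    g -= 2.0 * C * yi * xi[j] * m
                    h += 2.0 * C * xi[j] * xi[j]
            w[j] -= g / h                     # Newton step on the sub-problem
    return w
```

On sparse data, each inner sum runs only over examples whose feature j is nonzero, which is what makes the per-coordinate cost low for document-classification problems.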


Linear programming support vector machines

Based on the conclusions of statistical learning theory, especially the VC dimension of linear functions, linear programming support vector machines (LP-SVMs) are presented, including linear and nonlinear linear programming SVMs. In linear programming SVMs, to improve training speed, the bound on the VC dimension is loosened appropriately. Simulation resul...
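The ℓ1-norm ("linear programming") SVM referenced above is commonly written as the following LP; this is a standard textbook form, and the paper's exact formulation may differ:

```latex
\begin{aligned}
\min_{w,\,b,\,\xi}\quad & \|w\|_1 + C \sum_{i=1}^{n} \xi_i \\
\text{s.t.}\quad & y_i\left(w^{\top} x_i + b\right) \ge 1 - \xi_i,
\qquad \xi_i \ge 0, \qquad i = 1, \dots, n.
\end{aligned}
```

Splitting $w = w^{+} - w^{-}$ with $w^{+}, w^{-} \ge 0$ replaces $\|w\|_1$ by $\sum_j (w_j^{+} + w_j^{-})$, so both the objective and the constraints become linear and the problem can be handed to any LP solver.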


Heuristic approaches for support vector machines with the ramp loss

Recently, Support Vector Machines with the ramp loss (RLM) have attracted attention from the computational point of view. In this technical note, we propose two heuristics, the first one based on solving the continuous relaxation of a Mixed Integer Nonlinear formulation of the RLM and the second one based on the training of an SVM classifier on a reduced dataset identified by an integer linear ...


Two-variable Block Dual Coordinate Descent Methods for Large-scale Linear Support Vector Machines

Coordinate descent (CD) methods have been a state-of-the-art technique for training large-scale linear SVM. The most common setting is to solve the dual problem of an SVM formulation without the bias term, for which the CD procedure of updating one variable at a time is very simple and easy to implement. In this work, we extend the one-variable setting to use two variables at each CD step. The ext...
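The one-variable baseline that the abstract extends can be sketched as follows, assuming the hinge-loss SVM dual without a bias term (variable names are ours; real solvers add shrinking and a random permutation of the update order):

```python
def dual_cd_svm(X, y, C=1.0, sweeps=100):
    # one-variable dual coordinate descent for the hinge-loss linear SVM
    # without bias: min_a 0.5 * a'Qa - e'a  s.t.  0 <= a_i <= C,
    # where Q_ij = y_i y_j x_i.x_j.  The primal vector w = sum_i a_i y_i x_i
    # is maintained incrementally so each update is O(nnz(x_i)).
    n, d = len(X), len(X[0])
    alpha = [0.0] * n
    w = [0.0] * d
    Qii = [sum(v * v for v in xi) for xi in X]   # diagonal of Q
    for _ in range(sweeps):
        for i in range(n):
            # gradient of the dual objective in alpha_i
            G = y[i] * sum(wj * xj for wj, xj in zip(w, X[i])) - 1.0
            # closed-form minimizer of the one-variable sub-problem,
            # projected onto the box [0, C]
            a_new = min(max(alpha[i] - G / Qii[i], 0.0), C)
            delta = a_new - alpha[i]
            if delta != 0.0:
                for j in range(d):
                    w[j] += delta * y[i] * X[i][j]
                alpha[i] = a_new
    return w
```

A two-variable block step, as in the paper, instead solves a 2-by-2 box-constrained quadratic sub-problem at each iteration, which can reduce the number of passes over the data.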



Journal

Journal title: Neural Processing Letters

Year: 2015

ISSN: 1370-4621,1573-773X

DOI: 10.1007/s11063-015-9456-z